Solving nonsmooth convex optimization with complexity O(ε^{-1/2})

Authors

  • Masoud Ahookhosh
  • Arnold Neumaier
Abstract

This paper describes an algorithm for solving structured nonsmooth convex optimization problems using the optimal subgradient algorithm (OSGA), a first-order method with complexity O(ε^{-2}) for Lipschitz continuous nonsmooth problems and O(ε^{-1/2}) for smooth problems with Lipschitz continuous gradient. If the nonsmoothness of the problem is manifested in a structured way, we reformulate the problem in a form that can be solved efficiently by OSGA with complexity O(ε^{-1/2}). To solve the reformulated problem, we equip OSGA with an appropriate prox-function for which the OSGA subproblem can be solved either in closed form or by a simple iterative scheme, which decreases the computational cost of applying the algorithm, especially for large-scale problems. We show that applying the new scheme is feasible for many problems arising in applications. Some numerical results are reported.
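The claim that the OSGA subproblem can be solved in closed form can be made concrete for the standard quadratic prox-function Q(x) = Q0 + ½‖x − x0‖² with Q0 > 0: the subproblem sup_x −(γ + ⟨h, x⟩)/Q(x) is then maximized at u = x0 − h/E, where E > 0 solves a scalar quadratic. The sketch below is illustrative Python, not the authors' code; `osga_subproblem` and its argument names are assumptions.

```python
import numpy as np

def osga_subproblem(gamma, h, x0, Q0):
    """Closed-form maximizer of e(x) = -(gamma + <h, x>) / Q(x) for the
    quadratic prox-function Q(x) = Q0 + 0.5*||x - x0||^2 with Q0 > 0.

    Returns (E, u) with E = sup_x e(x), attained at u = x0 - h / E.
    (Illustrative sketch of the closed form used in OSGA-type methods.)
    """
    beta = -(gamma + h @ x0)      # value of -(gamma + <h, x>) at x = x0
    # Stationarity gives u = x0 - h/E; substituting back yields the
    # scalar quadratic  Q0*E^2 - beta*E - 0.5*||h||^2 = 0  with root E > 0.
    hh = h @ h
    E = (beta + np.sqrt(beta**2 + 2.0 * Q0 * hh)) / (2.0 * Q0)
    u = x0 - h / E
    return E, u

# Tiny usage example with made-up data:
rng = np.random.default_rng(0)
x0 = rng.standard_normal(5)
h = rng.standard_normal(5)
gamma = -(h @ x0) - 1.0           # ensures beta = 1 > 0, so E > 0
E, u = osga_subproblem(gamma, h, x0, Q0=0.5)
print(E, -(gamma + h @ u) / (0.5 + 0.5 * np.dot(u - x0, u - x0)))  # equal
```

For a non-quadratic prox-function the same stationarity argument still reduces the subproblem to a one-dimensional root-finding problem, which appears to be the "simple iterative scheme" the abstract alludes to.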


Similar resources

An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...


Accelerated gradient sliding for structured convex optimization

Our main goal in this paper is to show that one can skip gradient computations for gradient descent type methods applied to certain structured convex programming (CP) problems. To this end, we first present an accelerated gradient sliding (AGS) method for minimizing the summation of two smooth convex functions with different Lipschitz constants. We show that the AGS method can skip the gradient...


Smoothing and First Order Methods: A Unified Framework

We propose a unifying framework that combines smoothing approximation with fast first-order algorithms for solving nonsmooth convex minimization problems. We prove that independently of the structure of the convex nonsmooth function involved, and of the given fast first-order iterative scheme, it is always possible to improve the complexity rate and reach an O(ε^{-1}) efficiency estimate by solving...
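As a concrete instance of this smoothing idea (a minimal sketch under the standard Moreau-envelope construction, not this paper's specific framework): replacing |t| by the Huber function gives a smooth approximation whose gradient is (1/μ)-Lipschitz and which underestimates |t| by at most μ/2, so choosing μ proportional to ε and running an accelerated gradient method yields the O(ε^{-1}) estimate.

```python
import numpy as np

def huber(t, mu):
    """Moreau envelope of |t| with parameter mu: smooth approximation with
    (1/mu)-Lipschitz gradient and 0 <= |t| - huber(t, mu) <= mu/2."""
    a = np.abs(t)
    return np.where(a <= mu, t**2 / (2 * mu), a - mu / 2)

def huber_grad(t, mu):
    """Gradient of the Huber smoothing: the clipped identity t/mu."""
    return np.clip(t / mu, -1.0, 1.0)

# Smoothing an l1 term: sum(huber(x, mu)) approximates ||x||_1 within n*mu/2.
x = np.array([-2.0, 0.05, 0.3])
print(np.abs(x).sum(), huber(x, 0.1).sum())
```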


Worst-Case Complexity of Smoothing Quadratic Regularization Methods for Non-Lipschitzian Optimization

In this paper, we propose a smoothing quadratic regularization (SQR) algorithm for solving a class of nonsmooth nonconvex, perhaps even non-Lipschitzian minimization problems, which have wide applications in statistics and sparse reconstruction. The proposed SQR algorithm is a first-order method. At each iteration, the SQR algorithm solves a strongly convex quadratic minimization problem...


Smoothing SQP Algorithm for Non-Lipschitz Optimization with Complexity Analysis

In this paper, we propose a smoothing sequential quadratic programming (SSQP) algorithm for solving a class of nonsmooth nonconvex, perhaps even non-Lipschitz minimization problems, which have wide applications in statistics and sparse reconstruction. At each step, the SSQP algorithm solves a strongly convex quadratic minimization problem with a diagonal Hessian matrix, which has a simple...
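Both quadratic-regularization snippets above (SQR and SSQP) build each iteration around a strongly convex quadratic subproblem with a diagonal Hessian. In the unconstrained case such a problem separates coordinate-wise and has the elementary closed-form minimizer sketched below (illustrative Python; names are assumptions, not the papers' code):

```python
import numpy as np

def solve_diag_quadratic(g, d, x_k):
    """Minimize q(x) = g^T (x - x_k) + 0.5 * (x - x_k)^T diag(d) (x - x_k)
    with d > 0 elementwise (strongly convex, diagonal Hessian).
    The problem separates by coordinate, so the minimizer is x_k - g/d."""
    assert np.all(d > 0), "diagonal Hessian must be positive for strong convexity"
    return x_k - g / d
```

With box constraints the separable structure still gives a closed form: clip each coordinate of x_k − g/d to its interval.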



Journal title:

Volume   Issue

Pages  -

Publication date: 2015